The Hoek–Brown failure criterion is an empirical stress surface that is used in rock mechanics to predict the failure of rock[1][2]. The original version of the Hoek–Brown criterion was developed by Evert Hoek and E. T. Brown in 1980 for the design of underground excavations[3]. In 1988, the criterion was extended for applicability to slope stability and surface excavation problems[4]. An update of the criterion was presented in 2002 that included improvements in the correlation between the model parameters and the geological strength index (GSI)[5].
The basic idea of the Hoek–Brown criterion was to start with the properties of intact rock and to add factors that reduce those properties because of the existence of joints in the rock[4]. Although a similar criterion for concrete had been developed in 1936, the significant tool that the Hoek–Brown criterion gave design engineers was a quantification of the relation between the stress state and Bieniawski's rock mass rating (RMR)[6]. The Hoek–Brown failure criterion is widely used in mining engineering design.
The Hoek–Brown criterion has the form[2]

\sigma_1 = \sigma_3 + \sqrt{A + B\,\sigma_3}

where \sigma_1 is the effective maximum principal stress, \sigma_3 is the effective minimum principal stress, and A, B are material constants. In terms of the maximum normal stress (\sigma) and maximum shear stress (\tau_m), the criterion takes the form

2\,\tau_m = \sqrt{A + B\,(\sigma - \tau_m)}

where

\sigma = \tfrac{1}{2}(\sigma_1 + \sigma_3)\,, \qquad \tau_m = \tfrac{1}{2}(\sigma_1 - \sigma_3)\,.
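As a numerical sketch, the principal-stress form of the criterion can be evaluated directly. The values of A and B below are illustrative only (chosen so that the implied compressive strength is 10 stress units) and are not taken from the references.

```python
import math

def sigma1_at_failure(sigma3, A, B):
    """Hoek-Brown criterion in principal-stress form:
    sigma1 = sigma3 + sqrt(A + B*sigma3), compression positive."""
    return sigma3 + math.sqrt(A + B * sigma3)

# Illustrative (hypothetical) material constants, not from the source
A, B = 100.0, 50.0
sigma3 = 2.0
sigma1 = sigma1_at_failure(sigma3, A, B)

# Maximum normal and shear stress at failure
sigma_m = 0.5 * (sigma1 + sigma3)
tau_m = 0.5 * (sigma1 - sigma3)
```

Note that `sigma_m - tau_m` recovers `sigma3`, so the shear-stress form `2*tau_m = sqrt(A + B*(sigma_m - tau_m))` holds identically for the computed state.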
We can convert the above relation into a form similar to the Mohr–Coulomb failure criterion by solving for \tau_m to get

\tau_m = \frac{1}{8}\left[-B + \sqrt{B^2 + 16\,(A + B\,\sigma)}\right].
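The solved Mohr–Coulomb-like form can be checked numerically against the principal-stress form; as a sketch, with the same illustrative constants as above:

```python
import math

def tau_m(sigma, A, B):
    """Shear strength as a function of the maximum normal stress sigma,
    i.e. the positive root of 4*tau^2 + B*tau - (A + B*sigma) = 0."""
    return (-B + math.sqrt(B**2 + 16.0 * (A + B * sigma))) / 8.0

# Consistency check (hypothetical constants, not from the source)
A, B = 100.0, 50.0
sigma3 = 2.0
sigma1 = sigma3 + math.sqrt(A + B * sigma3)   # principal-stress form
sigma = 0.5 * (sigma1 + sigma3)               # maximum normal stress
assert math.isclose(tau_m(sigma, A, B), 0.5 * (sigma1 - sigma3))
```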
The material constants A, B are related to the unconfined compressive strength (\sigma_c) and the tensile strength (\sigma_t) by[2]

\sigma_c = \sqrt{A}\,, \qquad \sigma_t = \frac{1}{2}\left(B - \sqrt{B^2 + 4A}\right).
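These relations follow from setting the confining stress, and then the axial stress, to zero in the principal-stress form. A minimal sketch, with tension taken as negative and the constants again hypothetical:

```python
import math

def strengths_from_constants(A, B):
    """Unconfined compressive and tensile strengths implied by the
    Hoek-Brown constants A, B (tension negative):
    sigma_c = sqrt(A);  sigma_t = (B - sqrt(B^2 + 4A)) / 2."""
    sigma_c = math.sqrt(A)
    sigma_t = 0.5 * (B - math.sqrt(B**2 + 4.0 * A))
    return sigma_c, sigma_t

# Hypothetical constants, not from the source
sigma_c, sigma_t = strengths_from_constants(100.0, 50.0)
```

The tensile strength returned is the negative root of sigma_t^2 - B*sigma_t - A = 0, so it satisfies the failure condition 0 = sigma_t + sqrt(A + B*sigma_t) at zero axial stress.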
If we set \sigma = 0 in the above equation, we get the pure shear Hoek–Brown criterion:

\tau_s = \frac{1}{8}\left[-B \pm \sqrt{B^2 + 16\,A}\right].
The two values of \tau_s are unsymmetric with respect to the \sigma-axis in the \sigma–\tau_m plane. This feature of the Hoek–Brown criterion appears unphysical[2], and care must be exercised when using this criterion in numerical simulations.
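The asymmetry of the two pure-shear roots can be illustrated numerically; the two roots of the quadratic 4*tau^2 + B*tau - A = 0 have unequal magnitudes whenever B is nonzero (constants below are again hypothetical):

```python
import math

def pure_shear_roots(A, B):
    """Both roots of 4*tau^2 + B*tau - A = 0, i.e. the pure shear
    (sigma = 0) Hoek-Brown criterion."""
    d = math.sqrt(B**2 + 16.0 * A)
    return (-B + d) / 8.0, (-B - d) / 8.0

# Hypothetical constants, not from the source
A, B = 100.0, 50.0
tau_pos, tau_neg = pure_shear_roots(A, B)
# |tau_pos| != |tau_neg|: the surface is not symmetric about the sigma-axis
```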